A lower bound on ℓ2 dimensionality reduction

Authors

  • Michel X. Goemans
  • Petar Maymounkov
Abstract

The main focus of this lecture is a lower bound on the dimension when doing dimensionality reduction with ε-distortion in ℓ1 and ℓ2. In particular, it will be shown that there exist graphs requiring Ω(log n) dimensions to embed with any fixed desired distortion in Euclidean space. The main technical result is:

Theorem 1 (Alon [1]). Let v1, . . . , vn+1 ∈ R^d and 1/√n ≤ ε < 1/3 be given, such that 1 ≤ ‖vi − vj‖ ≤ 1 + ε for all i ≠ j ∈ [n + 1]. Then the subspace spanned by v1, . . . , vn+1 has dimension d = Ω(log n / (ε² log(1/ε))).
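The theorem is nearly tight: by the Johnson-Lindenstrauss lemma, a Gaussian random projection of the unit-side regular simplex yields n + 1 points satisfying the hypothesis of Theorem 1 in only O(ε⁻² log n) dimensions. Below is a minimal numerical sketch of that matching construction; the parameters n, eps and the constant 8 in the target dimension m are illustrative assumptions, not values from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

n, eps = 2000, 0.25                       # chosen so that 1/sqrt(n) <= eps < 1/3
m = int(np.ceil(8 * np.log(n) / eps**2))  # JL target dimension; constant 8 is heuristic

# The n + 1 points are the vertices e_i / sqrt(2) of a regular simplex with
# side length 1. Projecting e_i / sqrt(2) by a Gaussian matrix Pi with
# N(0, 1/m) entries just selects column i of Pi, so we can sample the
# projected points directly: row i of W is Pi e_i / sqrt(2).
W = rng.normal(scale=1.0 / np.sqrt(2 * m), size=(n + 1, m))

# Pairwise distances via ||w_i - w_j||^2 = ||w_i||^2 + ||w_j||^2 - 2 <w_i, w_j>.
sq = np.einsum("ij,ij->i", W, W)
d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * (W @ W.T), 0.0)
d = np.sqrt(d2[np.triu_indices(n + 1, k=1)])

# With high probability every distance lands in [1 - eps, 1 + eps]; dividing by
# (1 - eps) then gives points with 1 <= ||v_i - v_j|| <= 1 + O(eps) in only
# m = O(eps^-2 log n) dimensions, matching Theorem 1 up to the log(1/eps) factor.
print(f"m = {m}, distances in [{d.min():.3f}, {d.max():.3f}]")
```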

Similar resources

The Johnson-Lindenstrauss Lemma Is Optimal for Linear Dimensionality Reduction

For any n > 1 and 0 < ε < 1/2, we show the existence of an n-point subset X of R^n such that any linear map from (X, ℓ2) to ℓ2^m with distortion at most 1 + ε must have m = Ω(min{n, ε⁻² log n}). Our lower bound matches the upper bounds provided by the identity matrix and the Johnson-Lindenstrauss lemma [JL84], improving the previous lower bound of Alon [Alo03] by a log(1/ε) factor.
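The min{n, ε⁻² log n} form reflects two regimes, illustrated by the small calculation below (the constant 8 and the sample parameters are assumptions; the Ω(·) hides an unknown constant).

```python
import math

def lower_bound(n: int, eps: float) -> float:
    """Larsen-Nelson lower bound Omega(min{n, eps^-2 log n}), constant omitted."""
    return min(n, math.log(n) / eps**2)

def jl_upper(n: int, eps: float, c: float = 8.0) -> int:
    """Matching upper bound via the Johnson-Lindenstrauss lemma (c is heuristic)."""
    return math.ceil(c * math.log(n) / eps**2)

for n, eps in [(10**6, 0.1), (10**6, 0.001)]:
    # For eps below roughly sqrt(log(n)/n) the min is n: the identity map is
    # optimal and no linear dimensionality reduction is possible at that distortion.
    print(f"n={n}, eps={eps}: lower ~ {lower_bound(n, eps):.0f}, JL upper = {jl_upper(n, eps)}")
```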

2D Dimensionality Reduction Methods without Loss

In this paper, several two-dimensional extensions of principal component analysis (PCA) and linear discriminant analysis (LDA) have been applied in a lossless dimensionality reduction framework for face recognition. In this framework, the benefits of dimensionality reduction are used to improve the performance of the predictive model, a support vector machine (...
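As one concrete example of such a two-dimensional extension, here is a minimal sketch of 2DPCA, which operates on image matrices directly rather than on flattened vectors. The helper name twod_pca, the array shapes, and the random data are illustrative assumptions; the paper's exact lossless framework (including the SVM step) is not reproduced.

```python
import numpy as np

def twod_pca(images: np.ndarray, k: int) -> np.ndarray:
    """2DPCA: images has shape (num_images, h, w); returns a (w, k) projection.

    Computes the image scatter matrix G = mean_i (A_i - mean)^T (A_i - mean)
    and keeps its top-k eigenvectors.
    """
    centered = images - images.mean(axis=0)
    G = np.einsum("nhw,nhv->wv", centered, centered) / len(images)
    eigvals, eigvecs = np.linalg.eigh(G)   # eigenvalues in ascending order
    return eigvecs[:, ::-1][:, :k]         # top-k columns

# Usage: reduce 64x64 images to 64xk feature matrices Y_i = A_i @ X.
rng = np.random.default_rng(0)
imgs = rng.normal(size=(100, 64, 64))
X = twod_pca(imgs, k=8)
features = imgs @ X                        # shape (100, 64, 8)
print(features.shape)
```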

CS 229r: Algorithms for Big Data, Lecture of October 13th: Lower Bounds on Dimensionality Reduction and the Distributional Johnson-Lindenstrauss Lemma

The first lower bound was proved by Jayram and Woodruff [1] and then by Kane, Meka, and Nelson [2]. The lower bound states that any (ε, δ)-DJL for ε, δ ∈ (0, 1/2) must have m = Ω(min{n, ε⁻² log(1/δ)}). The second proof builds on the following idea: since for every x we have the probabilistic guarantee P_{Π∼D_{ε,δ}}[ |‖Πx‖₂² − 1| > max{ε, ε²} ] < δ, the same holds for any distribution over x. We are going to pi...
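To make the guarantee concrete, here is a minimal Monte Carlo sketch estimating the failure probability for the standard Gaussian construction of Π; the parameters m, eps, and trials are assumptions for illustration, not values from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)
m, eps, trials = 200, 0.25, 100_000

# For Pi with i.i.d. N(0, 1/m) entries and any fixed unit vector x, rotational
# invariance gives ||Pi x||_2^2 ~ chi^2(m) / m, so the squared norm can be
# sampled directly without materializing Pi.
norms_sq = rng.chisquare(m, size=trials) / m
delta_hat = np.mean(np.abs(norms_sq - 1.0) > eps)

# The empirical failure rate decays like exp(-Theta(m eps^2)) as m grows,
# which is why m = Theta(eps^-2 log(1/delta)) suffices; the lower bound says
# this is optimal up to constants.
print(f"estimated failure probability = {delta_hat:.5f}")
```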

Dynamic Dimensionality Reduction and Similarity Distance Computation by Inner Product Approximations

Developing efficient ways of doing dimensionality reduction is crucial for query performance in multimedia databases. For approximation queries, a careful analysis must be performed of the approximation quality of the dimensionality reduction technique. Besides having the lower-bound property, we expect the techniques to provide good-quality distance measures when the similarity distance between ...
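For concreteness, here is a minimal sketch of the lower-bound property under an orthonormal transform: keeping the first k coefficients of an orthonormal DFT is contractive by Parseval, so filtering range queries in the reduced space can yield false positives but never false dismissals. The helper name reduce_fft and the parameters are illustrative assumptions, not the paper's method.

```python
import numpy as np

def reduce_fft(x: np.ndarray, k: int) -> np.ndarray:
    """Keep the first k coefficients of an orthonormal DFT of x."""
    return np.fft.fft(x, norm="ortho")[:k]

# By Parseval, dropping coefficients can only shrink distances:
#   ||reduce(x) - reduce(y)|| <= ||x - y||   (the lower-bound property).
rng = np.random.default_rng(0)
x, y = rng.normal(size=256), rng.normal(size=256)
lo = np.linalg.norm(reduce_fft(x, 16) - reduce_fft(y, 16))
hi = np.linalg.norm(x - y)
print(f"reduced distance {lo:.3f} <= original distance {hi:.3f}")
```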

Generalization Bounds for Supervised Dimensionality Reduction

We introduce and study the learning scenario of supervised dimensionality reduction, which couples dimensionality reduction and a subsequent supervised learning step. We present new generalization bounds for this scenario based on a careful analysis of the empirical Rademacher complexity of the relevant hypothesis set. In particular, we show an upper bound on the Rademacher complexity that is i...
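For reference, here is the standard definition of empirical Rademacher complexity and the usual generalization bound it yields for losses in [0, 1]; the paper's hypothesis set and exact bound may differ.

```latex
% Empirical Rademacher complexity of a hypothesis set H on a sample
% S = (z_1, ..., z_m), with i.i.d. uniform signs sigma_i in {-1, +1}:
\[
  \widehat{\mathfrak{R}}_S(H) \;=\;
  \mathbb{E}_{\sigma}\!\left[\, \sup_{h \in H} \frac{1}{m} \sum_{i=1}^{m} \sigma_i\, h(z_i) \right].
\]
% Standard bound: with probability at least 1 - delta, for all h in H,
\[
  R(h) \;\le\; \widehat{R}_S(h) \;+\; 2\,\widehat{\mathfrak{R}}_S(H)
  \;+\; 3\sqrt{\frac{\ln(2/\delta)}{2m}}.
\]
```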


Publication date: 2006